# Foundation Model
## Nous Consilience 40B
Nous Consilience 40B is a 40-billion-parameter generative text model pretrained from scratch in a decentralized manner, supporting multiple languages and designed to represent the broad spectrum of human creative output.
Tags: Large Language Model, Multilingual
Author: PsycheFoundation
## TimeMoE-200M
License: Apache-2.0
TimeMoE-200M is a 200-million-parameter time series foundation model from the billion-scale TimeMoE family, built on a Mixture of Experts (MoE) architecture and focused on time series forecasting tasks.
Tags: Climate Model
Author: Maple728
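The Mixture of Experts idea referenced in the description can be illustrated with a minimal, framework-free sketch: a gate scores all experts per input, only the top-k experts run, and their outputs are combined with softmax weights. This is an illustrative toy with linear experts and hypothetical names (`moe_forward`, `gate_w`, `expert_ws`), not TimeMoE's actual implementation.

```python
import numpy as np

def moe_forward(x, gate_w, expert_ws, top_k=2):
    """Route input x through its top-k experts, weighted by gate scores.

    x:         (d,) input vector
    gate_w:    (n_experts, d) gating weight matrix
    expert_ws: list of (d, d) expert weight matrices
    """
    logits = gate_w @ x                      # one gating score per expert
    top = np.argsort(logits)[-top_k:]        # indices of the top-k experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                 # softmax over the selected experts only
    # Sparse activation: combine outputs of the selected experts only
    return sum(w * (expert_ws[i] @ x) for w, i in zip(weights, top))

rng = np.random.default_rng(0)
d, n_experts = 4, 8
out = moe_forward(rng.normal(size=d),
                  rng.normal(size=(n_experts, d)),
                  [rng.normal(size=(d, d)) for _ in range(n_experts)])
print(out.shape)  # (4,)
```

Because only `top_k` of the `n_experts` expert matrices are applied per input, parameter count can grow with the number of experts while per-token compute stays roughly constant, which is the scaling argument behind MoE-based foundation models.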